Distributed Derivative-free Learning Method for Stochastic Optimization over a Network with Sparse Activity

Authors

Abstract


Similar Resources

An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization

We consider an unconstrained problem of minimizing a smooth convex function that is available only through noisy observations of its values, where the noise consists of two parts. As in stochastic optimization problems, the first part is stochastic in nature. In contrast, the second part is additive noise of unknown nature but bounded in absolute value. In the two-point ...
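The two-point setting referenced here is built around a randomized gradient estimator that queries the function twice per iteration. Below is a minimal sketch of that standard estimator; the function name, the smoothing parameter `mu`, and the demo objective are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def two_point_gradient_estimate(f, x, mu=1e-4, rng=None):
    """Two-point zeroth-order estimate of the gradient of f at x.

    Uses two function queries along a random Gaussian direction u; the
    estimate is unbiased for the gradient of the Gaussian-smoothed f.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)        # random search direction
    delta = f(x + mu * u) - f(x - mu * u)   # two (possibly noisy) queries
    return (delta / (2.0 * mu)) * u

# Minimal usage: zeroth-order gradient descent on a mildly noisy quadratic.
rng = np.random.default_rng(0)
f = lambda z: float(np.sum(z ** 2)) + 1e-3 * rng.standard_normal()
x = np.ones(5)
for _ in range(2000):
    x -= 0.01 * two_point_gradient_estimate(f, x, mu=0.05, rng=rng)
print(x)  # entries should end up near zero
```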


Derivative free optimization method

Derivative-free optimization (DFO) methods are typically designed to solve optimization problems whose objective function is computed by a "black box"; hence, gradient information is unavailable. Each call to the "black box" is often expensive, so estimating derivatives by finite differences may be prohibitively costly. Finally, the objective function value may be computed with some noise, ...
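To make the cost argument concrete, a forward-difference gradient estimate needs n + 1 black-box calls for an n-dimensional input, which is exactly what becomes prohibitive when each call is expensive. A minimal sketch, with illustrative names and step size `h`:

```python
import numpy as np

def forward_difference_gradient(f, x, h=1e-6):
    """Approximate the gradient of a black-box f by forward differences.

    Costs n + 1 calls to f for an n-dimensional x; when each "black box"
    call is expensive, this is the cost that DFO methods try to avoid.
    """
    f0 = f(x)                        # one baseline evaluation
    g = np.empty_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f0) / h   # one extra evaluation per coordinate
    return g
```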


Sparse Learning for Stochastic Composite Optimization

In this paper, we focus on Stochastic Composite Optimization (SCO) for sparse learning, which aims to learn a sparse solution. Although many SCO algorithms have been developed for sparse learning with an optimal convergence rate O(1/T), they often fail to deliver sparse solutions at the end, either because of limited sparsity regularization during stochastic optimization or due to the limitat...
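The standard way such methods enforce sparsity during stochastic optimization is a proximal (soft-thresholding) step after each stochastic-gradient update. The sketch below shows that generic ISTA-style update for an ℓ1-regularized objective; it is an illustrative sketch, not the specific algorithm of this paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1: shrinks entries toward zero."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_sgd_step(w, stoch_grad, eta, lam):
    """One stochastic proximal-gradient step for min E[f(w)] + lam * ||w||_1.

    Plain SGD on the same objective rarely produces exactly-zero entries;
    the soft-threshold after each step is what yields sparse iterates.
    """
    return soft_threshold(w - eta * stoch_grad, eta * lam)
```

The shrinkage amount eta * lam sets small entries exactly to zero, which is the sparsity that, as the abstract notes, plain stochastic updates often fail to deliver.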


Inexact Restoration Method for Derivative-Free Optimization with Smooth Constraints

A new method is introduced for solving constrained optimization problems in which the derivatives of the constraints are available but the derivatives of the objective function are not. The method is based on the Inexact Restoration framework, by means of which each iteration is divided into two phases. In the first phase, only the constraints are considered, in order to improve feasibility. In the...
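A schematic of the two-phase structure described here, with the restoration and minimization phases abstracted as callables; this is a hedged sketch of the general Inexact Restoration pattern, not the paper's exact method, and the names and acceptance test are illustrative.

```python
def inexact_restoration_iteration(x, restore_feasibility, improve_objective, merit):
    """One schematic Inexact Restoration iteration (illustrative sketch only).

    Phase 1 uses the available constraint derivatives to move toward
    feasibility; Phase 2 reduces the derivative-free objective while
    staying near the restored point. A merit function decides acceptance.
    """
    y = restore_feasibility(x)   # Phase 1: improve feasibility only
    z = improve_objective(y)     # Phase 2: decrease the objective near y
    return z if merit(z) < merit(x) else x
```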


A Derivative-Free Optimization Algorithm Using Sparse Grid Integration

We present a new derivative-free optimization algorithm based on sparse grid numerical integration. The algorithm applies to smooth nonlinear objective functions whose gradient cannot be computed and whose value is very expensive to evaluate. The new algorithm has: 1) a unique starting point strategy; 2) an effective global search heuristic; and 3) consistent local convergence...



Journal

Journal title: IEEE Transactions on Automatic Control

Year: 2021

ISSN: 0018-9286, 1558-2523, 2334-3303

DOI: 10.1109/tac.2021.3077516